Multimodal Sentence Intelligibility and the Detection of Auditory-Visual Asynchrony in Speech and Nonspeech Signals: A First Report
Authors
Abstract
The ability to perceive and understand visual-only speech, and the benefit experienced from having both auditory and visual signals available during speech perception tasks, vary widely in the normal-hearing population. At the present time, little is known about the underlying neural mechanisms responsible for this variability or the possible relationships between multisensory speech perception abilities and performance on other perceptual or cognitive tasks. Previous studies have hypothesized that lipreading ability and auditory-visual (AV) benefit measures might be positively correlated with the ability to detect asynchronies in AV signals. Good integrators might be more attuned to detailed temporal relationships between auditory and visual information. However, this hypothesis has not been explicitly tested in normal-hearing individuals. In the present investigation, 30 normal-hearing participants were given a modified clinical test of sentence intelligibility (the CUNY sentences) under auditory-only, visual-only, and AV presentation conditions. The same participants also performed an AV asynchrony detection task using both speech and nonspeech multimodal stimuli that varied over a range of temporal asynchronies. The results suggest a relationship between auditory-only, visual-only, and AV sentence intelligibility measures and the ability to detect AV asynchrony in speech and nonspeech signals. Implications for AV integration in speech perception are discussed.
Similar Articles
Detection of Auditory-Visual Asynchrony in Speech and Nonspeech Signals
Two experiments were conducted to examine the temporal limitations on the detection of asynchrony in auditory-visual (AV) signals. Each participant made asynchrony judgments about speech and nonspeech signals presented over an 800-ms range of AV onset asynchronies. Consistent with previous findings, all conditions revealed a wide window of several hundred milliseconds over which AV signals were...
Audiovisual asynchrony detection for speech and nonspeech signals
This study investigated the “intersensory temporal synchrony window” [1] for audiovisual (AV) signals. A speeded asynchrony detection task was used to measure each participant’s temporal synchrony window for speech and nonspeech signals over an 800-ms range of AV asynchronies. Across three sets of stimuli, the video-leading threshold for asynchrony detection was larger than the audio-leading th...
Audiovisual asynchrony detection in human speech
Combining information from the visual and auditory senses can greatly enhance intelligibility of natural speech. Integration of audiovisual speech signals is robust even when temporal offsets are present between the component signals. In the present study, we characterized the temporal integration window for speech and nonspeech stimuli with similar spectrotemporal structure to investigate to w...
Detection of auditory (cross-spectral) and auditory-visual (cross-modal) synchrony
Detection thresholds for temporal synchrony in auditory and auditory-visual sentence materials were obtained on normal-hearing subjects. For auditory conditions, thresholds were determined using an adaptive-tracking procedure to control the degree of temporal asynchrony of a narrow audio band of speech, both positive and negative in separate tracks, relative to three other narrow audio bands of...
Speech intelligibility in 8- to 12-year-old children with spastic cerebral palsy
Background and purpose: Speech intelligibility refers to how well speech is understood by listeners. This study examined speech intelligibility in children (Persian native speakers) with spastic cerebral palsy aged 8-12 years old. Materials and methods: A cross-sectional study was performed in 31 dysarthric students (….. boys and ….. girls) in Tehran, 2014. A list of w...
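The cross-spectral/cross-modal synchrony abstract above mentions thresholds obtained with an adaptive-tracking procedure but does not describe it. As a purely illustrative sketch, and not the procedure used in any of the studies listed here, one common choice in psychoacoustics is a 2-down/1-up staircase that converges near the 70.7%-detection point. The `staircase_threshold` function, its parameters, and the simulated `observer` below are all hypothetical:

```python
import math
import random

def staircase_threshold(detects, start=400.0, step=40.0,
                        n_reversals=8, floor=0.0):
    """Estimate an asynchrony-detection threshold (in ms) with a
    2-down/1-up adaptive staircase: two detections in a row make the
    next trial harder (smaller asynchrony), one miss makes it easier
    (larger asynchrony). The track converges near 70.7% detection."""
    level = start
    streak = 0
    last_direction = 0            # -1 = last move was down, +1 = up
    reversals = []
    while len(reversals) < n_reversals:
        if detects(level):
            streak += 1
            if streak == 2:       # two detections in a row -> step down
                streak = 0
                if last_direction == +1:
                    reversals.append(level)
                last_direction = -1
                level = max(floor, level - step)
        else:                     # miss -> step up
            streak = 0
            if last_direction == -1:
                reversals.append(level)
            last_direction = +1
            level += step
    # Threshold estimate: mean asynchrony at the tracked reversals.
    return sum(reversals) / len(reversals)

# Hypothetical simulated observer: detection probability rises with AV
# offset following a logistic psychometric function centred near 240 ms.
random.seed(1)
def observer(offset_ms):
    p = 1.0 / (1.0 + math.exp(-(offset_ms - 240.0) / 30.0))
    return random.random() < p

print(round(staircase_threshold(observer)))  # threshold estimate in ms
```

In practice `detects` would be a live response from a participant on each trial rather than a simulation; separate tracks would be run for audio-leading and video-leading offsets, since the abstracts above report asymmetric windows.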